Catalytic Majorization and $$\ell_p$$ Norms
Authors
Abstract
Similar Resources
Catalytic majorization and $\ell_p$ norms
An important problem in quantum information theory is the mathematical characterization of quantum catalysis: when can surrounding entanglement be used to perform transformations of a jointly held quantum state under LOCC (local operations and classical communication)? Mathematically, the question amounts to describing, for a fixed vector $y$, the set $T(y)$ of vectors $x$ such ...
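The set $T(y)$ is built on the majorization order, which is easy to test numerically: $x$ is majorized by $y$ when the totals agree and every partial sum of the descending rearrangement of $x$ is bounded by the corresponding one for $y$. A minimal sketch in Python, using the well-known Jonathan-Plenio catalysis example (the vectors are the textbook ones; the helper names are my own):

```python
def majorizes(y, x, tol=1e-12):
    """True if x is majorized by y (x ≺ y): totals agree and every
    partial sum of the descending rearrangement of x is <= y's."""
    xs = sorted(x, reverse=True)
    ys = sorted(y, reverse=True)
    if abs(sum(xs) - sum(ys)) > tol:
        return False
    cx = cy = 0.0
    for a, b in zip(xs, ys):
        cx += a
        cy += b
        if cx > cy + tol:
            return False
    return True

def kron(u, v):
    """Tensor (Kronecker) product of two probability vectors."""
    return [a * b for a in u for b in v]

# Jonathan-Plenio example: x cannot be transformed into y directly,
# but borrowing the catalyst z makes the transformation possible.
x = [0.4, 0.4, 0.1, 0.1]
y = [0.5, 0.25, 0.25, 0.0]
z = [0.6, 0.4]

print(majorizes(y, x))                    # False: x is not majorized by y
print(majorizes(kron(y, z), kron(x, z)))  # True:  x⊗z is majorized by y⊗z
```

The catalyst $z$ is returned intact after the transformation, which is exactly the catalysis phenomenon the abstract refers to.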
Robust blind methods using $\ell_p$ quasi norms
It was shown in previous work that some blind methods can be made robust to channel-order overmodeling by using the $\ell_1$ or $\ell_p$ quasi-norms. However, no theoretical argument has been provided to support this statement. In this work, we study the robustness of subspace-based blind methods using the $\ell_1$ or $\ell_p$ quasi-norms. For the $\ell_1$ norm, we provide the necessary and sufficient condition that the chann...
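As a reminder of why $\ell_p$ with $p < 1$ is only a quasi-norm: the triangle inequality fails. A two-line numerical check (vectors chosen purely for illustration):

```python
def lp(x, p):
    """The l_p functional (sum |x_i|^p)^(1/p): a norm for p >= 1,
    only a quasi-norm for 0 < p < 1."""
    return sum(abs(t) ** p for t in x) ** (1.0 / p)

x, y, p = [1.0, 0.0], [0.0, 1.0], 0.5
print(lp(x, p) + lp(y, p))                   # 2.0
print(lp([a + b for a, b in zip(x, y)], p))  # 4.0 > 2.0: triangle inequality fails
```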
Stochastic domination for iterated convolutions and catalytic majorization
We study how iterated convolutions of probability measures compare under stochastic domination. We give necessary and sufficient conditions for the existence of an integer $n$ such that $\mu^{*n}$ is stochastically dominated by $\nu^{*n}$ for two given probability measures $\mu$ and $\nu$. As a consequence we obtain a similar theorem on the majorization order for vectors in $\mathbb{R}^d$. In particular we prove results about catalysi...
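The objects in this abstract are easy to experiment with for finitely supported measures on the integers: convolution is a double sum, and stochastic domination compares tail masses $\mu([t,\infty)) \le \nu([t,\infty))$ at every threshold. A sketch (the example measures are illustrative, not taken from the paper):

```python
from collections import defaultdict

def convolve(p, q):
    """Convolution of two finitely supported measures on the integers,
    given as {point: mass} dictionaries."""
    r = defaultdict(float)
    for a, pa in p.items():
        for b, qb in q.items():
            r[a + b] += pa * qb
    return dict(r)

def power(mu, n):
    """n-fold convolution power mu^{*n}."""
    out = {0: 1.0}
    for _ in range(n):
        out = convolve(out, mu)
    return out

def dominated(mu, nu, tol=1e-12):
    """True if mu is stochastically dominated by nu:
    mu([t, inf)) <= nu([t, inf)) for every threshold t."""
    for t in sorted(set(mu) | set(nu)):
        tail_mu = sum(p for a, p in mu.items() if a >= t)
        tail_nu = sum(p for a, p in nu.items() if a >= t)
        if tail_mu > tail_nu + tol:
            return False
    return True

mu = {0: 0.5, 1: 0.5}
nu = {0: 0.3, 1: 0.4, 2: 0.3}
print(dominated(mu, nu))                      # True: nu shifts mass to larger values
print(dominated(nu, mu))                      # False
print(dominated(power(mu, 3), power(nu, 3)))  # True: domination survives convolution
```

Stochastic domination is preserved by convolution, so here domination at $n = 1$ propagates to all powers; the interesting case studied in the paper is when domination first appears at some $n > 1$.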
Correspondence between probabilistic norms and fuzzy norms
In this paper, the connection between Menger probabilistic norms and Höhle probabilistic norms is discussed. In addition, the correspondence between probabilistic norms and Wu-Fang fuzzy (semi-)norms is established. It is shown that a probabilistic norm (with triangular norm $\min$) can generate a Wu-Fang fuzzy semi-norm and, conversely, a Wu-Fang fuzzy norm can generate a probabilistic norm.
Rényi divergence and majorization
Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
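For finite distributions, Rényi divergence is $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_i p_i^{\alpha} q_i^{1-\alpha}$, and the limit $\alpha \to 1$ recovers the Kullback-Leibler divergence. A minimal sketch, assuming $q_i > 0$ wherever $p_i > 0$ (the example distributions are illustrative):

```python
import math

def renyi_divergence(p, q, alpha):
    """D_alpha(p || q) for finite distributions given as probability
    lists; assumes q_i > 0 wherever p_i > 0, and alpha > 0."""
    if alpha == 1.0:  # the alpha -> 1 limit is the Kullback-Leibler divergence
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)

p = [0.5, 0.5]
q = [0.75, 0.25]
for a in (0.5, 1.0, 2.0):
    print(a, renyi_divergence(p, q, a))  # values are nondecreasing in alpha
```

One of the properties reviewed in the paper, monotonicity in the order $\alpha$, is visible directly in the printed values.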
Journal
Journal title: Communications in Mathematical Physics
Year: 2007
ISSN: 0010-3616,1432-0916
DOI: 10.1007/s00220-007-0382-4